7,113 research outputs found

    Favoring Generalists over Specialists: How Attentional Biasing Improves Perceptual Category Learning

    Full text link
    A model of cortical learning is proposed, which incorporates supervised feedback using two forms of attention: (i) feature-specific attention, which allows the network to learn associations between specific feature conjunctions (or categories) and outputs, and (ii) nonspecific attentional "vigilance," which biases this learning when the associations appear to be incorrect. Attentional vigilance improves learning if it favors, via modulatory weights, generalist categories over specialist categories. A biologically plausible neural network is proposed which implements these computational principles and which outperforms several classifiers on classification benchmarks. Defense Advanced Research Projects Agency and Office of Naval Research (N00014-95-1-0409)

    A Constructive, Incremental-Learning Network for Mixture Modeling and Classification

    Full text link
    Gaussian ARTMAP (GAM) is a supervised-learning adaptive resonance theory (ART) network that uses Gaussian-defined receptive fields. Like other ART networks, GAM incrementally learns and constructs a representation of sufficient complexity to solve a problem it is trained on. GAM's representation is a Gaussian mixture model of the input space, with learned mappings from the mixture components to output classes. We show a close relationship between GAM and the well-known Expectation-Maximization (EM) approach to mixture modeling. GAM outperforms an EM classification algorithm on a classification benchmark, thereby demonstrating the advantage of the ART match criterion for regulating learning, and the ARTMAP match tracking operation for incorporating environmental feedback in supervised learning situations. Office of Naval Research (N00014-95-1-0409)
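    The representation the abstract describes can be made concrete with a small sketch. This is not the published GAM code; the class names and the toy categories are invented for illustration. Each learned category is a separable Gaussian (per-dimension mean and standard deviation) with a usage count serving as a relative prior, mapped to an output class; classification sums the activations of the components mapped to each class.

```python
import math

# Hedged sketch of GAM's learned representation (names are illustrative):
# categories are separable Gaussians with counts, each mapped to a class.

class Category:
    def __init__(self, mean, std, count, label):
        self.mean, self.std, self.count, self.label = mean, std, count, label

    def log_activation(self, x):
        # log of (relative prior) times a separable Gaussian density,
        # dropping constants shared by all categories
        log_p = math.log(self.count)
        for xi, mi, si in zip(x, self.mean, self.std):
            log_p += -0.5 * ((xi - mi) / si) ** 2 - math.log(si)
        return log_p

def classify(categories, x):
    # Sum component activations per class label; return the best class.
    scores = {}
    for c in categories:
        scores[c.label] = scores.get(c.label, 0.0) + math.exp(c.log_activation(x))
    return max(scores, key=scores.get)

cats = [
    Category([0.0], [1.0], 10, "A"),
    Category([4.0], [1.0], 10, "B"),
]
print(classify(cats, [0.5]))  # prints A: nearer the first component's mean
```

In the full network, several categories may share a class label, which is what makes the representation a mixture model per class rather than one prototype per class.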

    Dynamic Binding of Visual Contours Without Temporal Coding

    Full text link
    Recognition of objects in complex visual scenes is greatly simplified by the ability to segment features belonging to different objects while grouping features belonging to the same object. This feature-binding process can be driven by the local relations between visual contours. The standard method for implementing this process with neural networks uses a temporal code to bind features together. I propose a spatial coding alternative for the dynamic binding of visual contours, and demonstrate the spatial coding method for segmenting an image consisting of three overlapping objects. Office of Naval Research (N00014-91-J-4100)

    Comparison of Gaussian ARTMAP and the EM Algorithm

    Full text link
    Gaussian ARTMAP (GAM) is a supervised-learning adaptive resonance theory (ART) network that uses Gaussian-defined receptive fields. Like other ART networks, GAM incrementally learns and constructs a representation of sufficient complexity to solve a problem it is trained on. GAM's representation is a Gaussian mixture model of the input space, with learned mappings from the mixture components to output classes. We show a close relationship between GAM and the well-known Expectation-Maximization (EM) approach to mixture modeling. GAM outperforms an EM classification algorithm on a classification benchmark, thereby demonstrating the advantage of the ART match criterion for regulating learning, and the ARTMAP match tracking operation for incorporating environmental feedback in supervised learning situations. Office of Naval Research (N00014-95-1-0409)
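    For readers unfamiliar with the baseline this entry compares against, here is a minimal one-dimensional EM step for a two-component Gaussian mixture. The data and initial parameters are illustrative, not from the paper; this is the textbook algorithm, not GAM's incremental rule.

```python
import math

# Minimal 1-D EM for a two-component Gaussian mixture (the classical
# batch algorithm that GAM is compared against; toy data).

def em_step(data, means, stds, weights):
    # E-step: responsibility of each component for each point
    resp = []
    for x in data:
        dens = [w * math.exp(-0.5 * ((x - m) / s) ** 2) / s
                for m, s, w in zip(means, stds, weights)]
        total = sum(dens)
        resp.append([d / total for d in dens])
    # M-step: re-estimate means, stddevs, and mixing weights
    n = len(data)
    new_means, new_stds, new_weights = [], [], []
    for k in range(len(means)):
        rk = sum(r[k] for r in resp)
        mu = sum(r[k] * x for r, x in zip(resp, data)) / rk
        var = sum(r[k] * (x - mu) ** 2 for r, x in zip(resp, data)) / rk
        new_means.append(mu)
        new_stds.append(max(math.sqrt(var), 1e-3))  # floor to avoid collapse
        new_weights.append(rk / n)
    return new_means, new_stds, new_weights

data = [0.1, -0.2, 0.3, 4.9, 5.2, 5.1]
means, stds, weights = [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]
for _ in range(20):
    means, stds, weights = em_step(data, means, stds, weights)
print(sorted(round(m, 1) for m in means))  # means settle near the two clusters
```

The contrast the abstract draws is that EM revisits the whole batch on every iteration, while GAM updates its components one sample at a time under the ART match criterion.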

    A Neural Model for Self Organizing Feature Detectors and Classifiers in a Network Hierarchy

    Full text link
    Many models of early cortical processing have shown how local learning rules can produce efficient, sparse-distributed codes in which nodes have responses that are statistically independent and low probability. However, it is not known how to develop a useful hierarchical representation, containing sparse-distributed codes at each level of the hierarchy, that incorporates predictive feedback from the environment. We take a step in that direction by proposing a biologically plausible neural network model that develops receptive fields, and learns to make class predictions, with or without the help of environmental feedback. The model is a new type of predictive adaptive resonance theory network called Receptive Field ARTMAP, or RAM. RAM self organizes internal category nodes that are tuned to activity distributions in topographic input maps. Each receptive field is composed of multiple weight fields that are adapted via local, on-line learning, to form smooth receptive fields that reflect the statistics of the activity distributions in the input maps. When RAM generates incorrect predictions, its vigilance is raised, amplifying subtractive inhibition and sharpening receptive fields until the error is corrected. Evaluation on several classification benchmarks shows that RAM outperforms a related (but neurally implausible) model called Gaussian ARTMAP, as well as several standard neural network and statistical classifiers. A topographic version of RAM is proposed, which is capable of self organizing hierarchical representations. Topographic RAM is a model for receptive field development at any level of the cortical hierarchy, and provides explanations for a variety of perceptual learning data. Defense Advanced Research Projects Agency and Office of Naval Research (N00014-95-1-0409)
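    The "amplifying subtractive inhibition" mechanism can be illustrated with a toy model. The function names and form below are mine, not RAM's published equations: treat a receptive field as a Gaussian tuning curve minus an inhibition level set by vigilance, half-wave rectified, and observe that raising vigilance narrows the range of inputs the node responds to.

```python
import math

# Illustrative sketch: subtractive inhibition sharpens a receptive field.

def response(x, center, width, vigilance):
    excitation = math.exp(-0.5 * ((x - center) / width) ** 2)
    return max(excitation - vigilance, 0.0)  # subtract inhibition, rectify

def active_halfwidth(width, vigilance, step=0.01):
    # Distance from the field's center at which the response falls to zero.
    d = 0.0
    while response(d, 0.0, width, vigilance) > 0.0:
        d += step
    return d

low_vigilance = active_halfwidth(1.0, 0.1)
high_vigilance = active_halfwidth(1.0, 0.6)
print(low_vigilance > high_vigilance)  # True: higher vigilance, sharper field
```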

    Self-Organization of Topographic Mixture Networks Using Attentional Feedback

    Full text link
    This paper proposes a biologically-motivated neural network model of supervised learning. The model possesses two novel learning mechanisms. The first is a network for learning topographic mixtures. The network's internal category nodes are the mixture components, which learn to encode smooth distributions in the input space by taking advantage of topography in the input feature maps. The second mechanism is an attentional biasing feedback circuit. When the network makes an incorrect output prediction, this feedback circuit modulates the learning rates of the category nodes, by amounts based on the sharpness of their tuning, in order to improve the network's prediction accuracy. The network is evaluated on several standard classification benchmarks and shown to perform well in comparison to other classifiers. Possible relationships are discussed between the network's learning properties and those of biological neural networks. Possible future extensions of the network are also discussed. Defense Advanced Research Projects Agency and the Office of Naval Research (N00014-95-1-0409)
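    One plausible toy reading of the attentional biasing circuit, consistent with the "generalists over specialists" entry above but not taken from the paper's equations, is a feedback signal that rescales each node's learning rate by its tuning width, so broadly tuned nodes adapt more after an error than sharply tuned ones. All names here are mine.

```python
# Illustrative sketch of attentional biasing of learning rates.

def biased_learning_rates(base_rate, widths, error_signal):
    # error_signal in [0, 1]: 0 = prediction correct, 1 = maximally wrong.
    # With no error, the base rate is used unchanged; with error, rates
    # are boosted in proportion to each node's tuning width.
    max_w = max(widths)
    return [base_rate * (1.0 + error_signal * (w / max_w)) for w in widths]

widths = [0.2, 1.0, 2.0]  # sharply tuned ("specialist") ... broad ("generalist")
print(biased_learning_rates(0.1, widths, 1.0))  # broadest node learns fastest
```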

    The First Encounter: Fighting for Naval Supremacy on Lake Ontario, 7–10 August 1813

    Get PDF
    To upgrade the fighting ability of the Provincial Marine, the Royal Navy sent one of their best young commodores along with 465 officers and ratings to operate the ships of the Lake Ontario squadron. This detachment of Royal Navy personnel, including four commanders, were all veterans with a wealth of sea experience. Commodore Sir James Lucas Yeo was described as a zealous, enterprising officer whose daring was unequalled in the annals of the Royal Navy. Hence his rapid rise to flag rank and his knighthood at the age of thirty-one. The purpose of this article is to illustrate that the way in which Chauncey and Yeo conducted their operations on Lake Ontario was very much in keeping with their background and experience. It was evident from their first encounter that Yeo, the veteran, was the confident aggressor while Chauncey, the administrator, was wary of the reputation of his knighted opponent and unsure of his own squadron's capabilities.

    Neural Network for Dynamic Binding with Graph Representation: Form, Linking, and Depth-From-Occlusion

    Full text link
    A neural network is presented which explicitly represents form attributes and relations between them, thus solving the binding problem without temporal coding. Rather, the network creates a graph representation by dynamically allocating nodes to code local form attributes and establishing arcs to link them. With this representation, the network selectively groups and segments in depth objects based on line junction information, producing results consistent with those of several recent visual search experiments. In addition to depth-from-occlusion, the network provides a sufficient framework for local line-labelling processes to recover other 3-D variables, such as edge/surface contiguity, edge slant, and edge convexity. Air Force Office of Scientific Research (F49620-92-J-0225); National Science Foundation (IRI-90-24877, IRI-90-00530); Office of Naval Research (N00014-91-J-4100, N00014-92-J-4015)
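    The core of the graph idea can be sketched in a few lines. The data and names below are invented for illustration: nodes code local contour attributes, arcs link attributes that local junction evidence says belong together, and segmentation falls out as connected components of the graph, with no temporal code needed to bind features to objects.

```python
# Toy sketch: binding by graph connectivity rather than temporal coding.

def connected_components(nodes, arcs):
    # arcs: undirected links between contour nodes
    neighbors = {n: set() for n in nodes}
    for a, b in arcs:
        neighbors[a].add(b)
        neighbors[b].add(a)
    seen, groups = set(), []
    for n in nodes:
        if n in seen:
            continue
        stack, group = [n], set()
        while stack:
            m = stack.pop()
            if m in group:
                continue
            group.add(m)
            stack.extend(neighbors[m] - group)
        seen |= group
        groups.append(group)
    return groups

# Contour fragments from two overlapping objects; arcs come from local
# junction evidence that fragments belong to the same object.
nodes = ["e1", "e2", "e3", "e4", "e5"]
arcs = [("e1", "e2"), ("e2", "e3"), ("e4", "e5")]
print(len(connected_components(nodes, arcs)))  # prints 2: two bound objects
```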

    Gaussian ARTMAP: A Neural Network for Fast Incremental Learning of Noisy Multidimensional Maps

    Full text link
    A new neural network architecture for incremental supervised learning of analog multidimensional maps is introduced. The architecture, called Gaussian ARTMAP, is a synthesis of a Gaussian classifier and an Adaptive Resonance Theory (ART) neural network, achieved by defining the ART choice function as the discriminant function of a Gaussian classifier with separable distributions, and the ART match function as the same, but with the a priori probabilities of the distributions discounted. While Gaussian ARTMAP retains the attractive parallel computing and fast learning properties of fuzzy ARTMAP, it learns a more efficient internal representation of a mapping while being more resistant to noise than fuzzy ARTMAP on a number of benchmark databases. Several simulations are presented which demonstrate that Gaussian ARTMAP consistently obtains a better trade-off of classification rate to number of categories than fuzzy ARTMAP. Results on a vowel classification problem are also presented which demonstrate that Gaussian ARTMAP outperforms many other classifiers. National Science Foundation (IRI 90-00530); Office of Naval Research (N00014-92-J-4015, N00014-91-J-4100)
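    The relation the abstract states between the choice and match functions can be written down directly. This is a sketch with my own variable names, assuming separable (diagonal-covariance) Gaussians: the choice function is the log discriminant of a Gaussian classifier including the prior term, and the match function is the same quantity with the prior discounted.

```python
import math

# Sketch of the choice/match relation for separable Gaussian categories.

def log_gaussian(x, mean, std):
    # log of a separable Gaussian density, up to a shared constant
    return sum(-0.5 * ((xi - mi) / si) ** 2 - math.log(si)
               for xi, mi, si in zip(x, mean, std))

def choice(x, mean, std, log_prior):
    return log_prior + log_gaussian(x, mean, std)   # discriminant with prior

def match(x, mean, std):
    return log_gaussian(x, mean, std)               # prior discounted

x = [0.2, 0.1]
mean, std = [0.0, 0.0], [1.0, 1.0]
# A frequently used category (large prior) can win the choice competition
# yet still be judged on goodness-of-fit alone by the vigilance test.
print(choice(x, mean, std, math.log(0.9)), match(x, mean, std))
```

Separating the two functions is what lets match tracking reject a category that fits poorly even when its prior makes it the most active choice.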